Deepfake porn photos of Taylor Swift: What Microsoft CEO Satya Nadella has to say



AI-generated images of Taylor Swift emerged online recently, and Microsoft tools may be connected to them. A 404 Media report claims that members of a "Telegram group dedicated to abusive images of women" used Microsoft's AI-powered image-creation tool to make the fake photographs of Swift.
Microsoft recently renamed Bing Image Creator to Image Creator from Designer. The 404 Media report does not specify which tool the Telegram group's members used, but there is a good chance it was Bing Image Creator/Image Creator from Designer. Unlike the more conventional doctored images that have troubled celebrities in the past, the Swift images appear to have been created with an artificial intelligence image generator that can instantly produce new images from a written prompt.
What Satya Nadella said
Microsoft CEO Satya Nadella discussed the situation in an interview with NBC Nightly News, addressing what can be done to guide AI and guard against misuse. Nadella said, "I would say two things. One, is, again, I go back to, I think, what's our responsibility, which is all of the guardrails that we need to place around the technology so that there's more safe content that's being produced. And there's a lot to be done and a lot being done there. And we can do, especially when you have law and law enforcement and tech platforms that can come together, I think we can govern a lot more than we think- we give ourselves credit for."
Most commercial AI image generators have safeguards to prevent abuse, but commenters on anonymous message boards discussed tactics for circumventing that moderation, especially on Microsoft Designer's text-to-image tool, said Ben Decker, founder of the threat intelligence group Memetica.
How the images went viral
Sexually explicit and abusive fake images of Taylor Swift began circulating widely last week on the social media platform X, formerly Twitter. The deepfake porn images have made Swift the most famous victim of a scourge that tech platforms and anti-abuse groups have struggled to contain. The images reportedly first emerged from an ongoing campaign, begun last year on fringe platforms, to produce sexually explicit AI-generated images of celebrity women, Decker told news agency AP. One of the viral Swift images is said to have appeared online as early as January 6. Researchers found at least a couple dozen unique AI-generated images; the most widely shared were football-related, showing a painted or bloodied Swift in depictions that objectified her and in some cases inflicted violent harm on her deepfake persona.




